# Efficient with small parameters

## Qwen2.5 1.5B S1k 1.1
A text generation model fine-tuned from Qwen/Qwen2.5-1.5B-Instruct and trained with TRL, providing strong support for text generation tasks.
Tags: Large Language Model, Transformers · Author: rvindra · Downloads: 1,312 · Likes: 1
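The card says the model was trained with TRL. As a rough illustration only, the sketch below uses TRL's supervised fine-tuning entry point on the stated base model, assuming a recent TRL release; the dataset and hyperparameters are placeholders, not details from the card.

```python
# Minimal SFT sketch with TRL (assumed recent release).
# The training corpus and hyperparameters are illustrative placeholders,
# not the actual recipe behind this model.
from datasets import load_dataset
from trl import SFTConfig, SFTTrainer

dataset = load_dataset("trl-lib/Capybara", split="train")  # placeholder corpus

trainer = SFTTrainer(
    model="Qwen/Qwen2.5-1.5B-Instruct",  # base model named on the card
    train_dataset=dataset,
    args=SFTConfig(output_dir="qwen2.5-1.5b-sft", max_steps=100),
)
trainer.train()
```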
## Gemma 2 2b It Chinese Kyara Dpo
Kyara is a language-model fine-tuning project enhanced by knowledge retrieval, focused on improving performance in lower-resource languages such as Traditional Chinese.
Tags: Large Language Model, Transformers, Supports Multiple Languages · Author: zake7749 · Downloads: 2,334 · Likes: 13
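The model name indicates a DPO (Direct Preference Optimization) stage. Below is a minimal DPO sketch with TRL on the Gemma 2 2B IT base, assuming a recent TRL release; the preference dataset is a placeholder, and Kyara's retrieval-augmented data pipeline is not reproduced here.

```python
# Minimal DPO sketch with TRL (assumed recent release, where the tokenizer
# is passed as processing_class). The preference dataset is a placeholder.
from datasets import load_dataset
from transformers import AutoModelForCausalLM, AutoTokenizer
from trl import DPOConfig, DPOTrainer

model_id = "google/gemma-2-2b-it"  # assumed base checkpoint
model = AutoModelForCausalLM.from_pretrained(model_id)
tokenizer = AutoTokenizer.from_pretrained(model_id)

dataset = load_dataset("trl-lib/ultrafeedback_binarized", split="train")

trainer = DPOTrainer(
    model=model,
    args=DPOConfig(output_dir="gemma-2-2b-kyara-dpo"),
    train_dataset=dataset,
    processing_class=tokenizer,
)
trainer.train()
```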
## Tinyllama 110M
License: MIT
A 110-million-parameter Llama 2 architecture model trained on the TinyStories dataset, suitable for lightweight text generation tasks.
Tags: Large Language Model, Transformers · Author: nickypro · Downloads: 1,472 · Likes: 5
## Tiny Starcoder Py
License: OpenRAIL
A 164-million-parameter Python code generation model based on the StarCoder architecture, specifically optimized for Python code generation tasks.
Tags: Large Language Model, Transformers · Author: bigcode · Downloads: 1,886 · Likes: 74
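At this parameter count, inference runs comfortably on CPU. The sketch below completes a Python snippet with the transformers pipeline; the checkpoint id is inferred from the card's author and name, so verify it on the Hub before use.

```python
# Minimal code-completion sketch via the transformers pipeline.
# The checkpoint id is inferred from the card (author: bigcode); verify on the Hub.
from transformers import pipeline

generator = pipeline("text-generation", model="bigcode/tiny_starcoder_py")

prompt = "def fibonacci(n):\n    "
completion = generator(prompt, max_new_tokens=40)[0]["generated_text"]
print(completion)
```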
## Gpt Neo 125M Code Search Py
A Python code auto-completion model fine-tuned from GPT-Neo-125M, specializing in method completion tasks.
Tags: Large Language Model · Author: flax-community · Downloads: 17 · Likes: 0